39 research outputs found

    «Mezh tem vverhu zvezda siyaet…» [“Meanwhile, a star shines above…”]: Zabolotsky and Mirovedenie

    Full text link
    The article was submitted on 20.01.2015. The article considers the points of contact between the poetic world of Nikolai Zabolotsky and the activity of the Russian Society of Amateurs of Natural Sciences (R.O.L.M.), which operated in Saint Petersburg (later Petrograd, then Leningrad) from 1909 to 1932. In 1928, the astronomer and science historian D. O. Svyatsky published an article in the Mirovedenie journal (Russian for "natural sciences") entitled A Tale of Star Chigir and the Telescopic Observations of Galileo (From the History of Astronomy in Russia). It examined translated Russian astronomical and astrological compilations of the 16th and 17th centuries (A Tale of King Solomon of What Is Great Sorrow and Wherefrom It Shall Come and others) that speak of the mysterious Star Chigir. Zabolotsky's poem The Mad Wolf (1931) draws not only on his acquaintance with the teachings of K. E. Tsiolkovsky but also on his impressions from reading the abovementioned article, and it bears traces of his reading of historians, ethnographers and folklorists who mentioned Star Chigir (I. P. Sakharov, A. N. Afanasyev, A. I. Sobolevsky, A. S. Yermolov, V. N. Perets). Other mentions of stars, constellations, planets, telescopes and astronomers in Zabolotsky's work (both early and late) are most likely also connected with the memory of the Society, which saw the prototype of the synthetic and almighty science of the future in the occult "ancient science" and "popular astronomy".

    Maximum Likelihood-based Online Adaptation of Hyper-parameters in CMA-ES

    Get PDF
    The Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is widely accepted as a robust derivative-free continuous optimization algorithm for non-linear and non-convex optimization problems. CMA-ES is well known to be almost parameterless, meaning that only one hyper-parameter, the population size, is left for the user to tune. In this paper, we propose a principled approach called self-CMA-ES to achieve the online adaptation of CMA-ES hyper-parameters in order to improve its overall performance. Experimental results show that for larger-than-default population sizes, the default settings of CMA-ES hyper-parameters are far from optimal, and that self-CMA-ES allows for dynamically approaching optimal settings.
    Comment: 13th International Conference on Parallel Problem Solving from Nature (PPSN 2014), 2014.
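    As a rough illustration of the setting the abstract describes, here is a minimal CMA-ES loop using the pycma package (a library choice assumed here, not prescribed by the paper), with the population size passed as the single user-facing hyper-parameter. The self-CMA-ES adaptation mechanism itself is not reproduced.

```python
# Minimal CMA-ES loop with the pycma package (pip install cma).
# This is a baseline sketch, not the authors' self-CMA-ES; it only
# illustrates that 'popsize' is the one hyper-parameter conventionally
# exposed to the user.
import cma

def sphere(x):
    """Sum-of-squares test objective."""
    return sum(xi * xi for xi in x)

# Start at the origin in 10 dimensions with initial step size 0.5;
# 'popsize' is the hyper-parameter the abstract refers to.
es = cma.CMAEvolutionStrategy(10 * [0.0], 0.5, {'popsize': 20})

while not es.stop():
    solutions = es.ask()                                 # sample a population
    es.tell(solutions, [sphere(x) for x in solutions])   # rank and update

print(es.result.xbest, es.result.fbest)
```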

    Hyperparameter Importance Across Datasets

    Full text link
    With the advent of automated machine learning, automated hyperparameter optimization methods are by now routinely used in data mining. However, this progress is not yet matched by equal progress on automatic analyses that yield information beyond performance-optimizing hyperparameter settings. In this work, we aim to answer the following two questions: Given an algorithm, what are generally its most important hyperparameters, and what are typically good values for these? We present methodology and a framework to answer these questions based on meta-learning across many datasets. We apply this methodology using the experimental meta-data available on OpenML to determine the most important hyperparameters of support vector machines, random forests and Adaboost, and to infer priors for all their hyperparameters. The results, obtained fully automatically, provide a quantitative basis to focus efforts in both manual algorithm design and in automated hyperparameter optimization. The conducted experiments confirm that the hyperparameters selected by the proposed method are indeed the most important ones and that the obtained priors also lead to statistically significant improvements in hyperparameter optimization.
    Comment: © 2018. Copyright is held by the owner/author(s). Publication rights licensed to ACM. This is the author's version of the work. It is posted here for your personal use, not for redistribution. The definitive Version of Record was published in Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining.
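    For intuition, the sketch below approximates the question on a single dataset: sample random SVM configurations, fit a random-forest surrogate mapping configurations to cross-validation scores, and rank hyperparameters by permutation importance. The paper itself uses functional ANOVA over meta-data from many OpenML datasets; the dataset, ranges and surrogate choice below are illustrative assumptions.

```python
# Simplified single-dataset stand-in for hyperparameter-importance
# analysis; NOT the paper's functional-ANOVA-across-datasets method.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)

# Evaluate random SVM configurations (log-scale C and gamma).
configs, scores = [], []
for _ in range(50):
    log_C, log_gamma = rng.uniform(-3, 3), rng.uniform(-4, 0)
    clf = SVC(C=10.0 ** log_C, gamma=10.0 ** log_gamma)
    configs.append([log_C, log_gamma])
    scores.append(cross_val_score(clf, X, y, cv=3).mean())

# Fit a surrogate: configuration -> cross-validation score.
surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(configs, scores)

# Rank hyperparameters by permutation importance on the surrogate.
imp = permutation_importance(surrogate, np.array(configs), np.array(scores),
                             n_repeats=10, random_state=0)
for name, value in zip(["log10(C)", "log10(gamma)"], imp.importances_mean):
    print(f"{name}: {value:.3f}")
```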

    Community detection‐based deep neural network architectures: A fully automated framework based on Likert‐scale data.

    Get PDF
    Deep neural networks (DNNs) have emerged as a state-of-the-art tool in very different research fields due to their ability to adapt to the decision space, since they do not presuppose any linear relationship in the data. Two main disadvantages of these models are that the choice of the network's underlying architecture profoundly influences its performance and that architecture design requires prior knowledge of the field of study. Questionnaires are widely used in the social and behavioral sciences. The main contribution of this work is to automate the design of a DNN architecture by using an agglomerative hierarchical clustering algorithm that mimics the conceptual structure of such surveys. Although the network is trained for regression, it is easily convertible to classification tasks. The proposed methodology is tested on a database containing socio-demographic data and the responses to five psychometric Likert scales related to the prediction of happiness. These scales have already been used to design a DNN architecture based on their subdimensions. We show that our new network configurations outperform the previously existing DNN architectures.
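    A minimal sketch of the grouping step the abstract builds on, under stated assumptions: synthetic Likert-style responses are clustered by agglomerative hierarchical clustering of item correlations, and each resulting cluster would seed one sub-network (block of hidden units) in the DNN. This is not the authors' framework, only an illustration of the idea.

```python
# Group questionnaire items via agglomerative hierarchical clustering
# of their correlations; each cluster could seed one DNN sub-network.
# Data here is synthetic (illustrative assumption).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
# 200 respondents, 12 Likert items (1..5); items 0-5 and 6-11 are
# driven by two latent factors, mimicking two questionnaire scales.
latent = rng.normal(size=(200, 2))
loadings = np.zeros((2, 12))
loadings[0, :6] = 1.0
loadings[1, 6:] = 1.0
noise = rng.normal(scale=0.7, size=(200, 12))
items = np.clip(np.round(3 + latent @ loadings + noise), 1, 5)

# Distance = 1 - |correlation|, then average-linkage clustering.
corr = np.corrcoef(items, rowvar=False)
dist = squareform(1.0 - np.abs(corr), checks=False)
clusters = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")

for c in np.unique(clusters):
    print(f"sub-network {c}: items {np.where(clusters == c)[0].tolist()}")
```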

    Vladimir Mayakovsky as "a Singing Ilya Muromets": "Karacharovo" by Viktor Sosnora

    Get PDF
    The article analyzes the poem "Karacharovo" by Viktor Sosnora. The poet entered the world of literature with a book of poems on Old Russia ("Horsemen"). He was well acquainted not only with written monuments but also with folklore, and was well-read in folklore studies. At the same time, Sosnora felt himself the heir to Russian Futurism. The main characters of the poem, the epic hero Ilya Muromets and the poet Vladimir Mayakovsky, are concealed through the figure of silence.

    Towards Better Integration of Surrogate Models and Optimizers

    Get PDF
    Surrogate-Assisted Evolutionary Algorithms (SAEAs) have been proven to be very effective in solving (synthetic and real-world) computationally expensive optimization problems with a limited number of function evaluations. The two main components of SAEAs are the surrogate model and the evolutionary optimizer, both of which use parameters to control their respective behavior. These parameters are likely to interact closely, and hence the exploitation of any such relationships may lead to the design of an enhanced SAEA. In this chapter, as a first step, we focus on Kriging and the Efficient Global Optimization (EGO) framework. We discuss potentially profitable ways to integrate the model and the optimizer more closely. Furthermore, we investigate in depth how different parameters of the model and the optimizer impact optimization results. In particular, we determine whether there are any interactions between these parameters, and how the problem characteristics impact optimization results. In the experimental study, we use the popular Black-Box Optimization Benchmarking (BBOB) testbed. Interestingly, the analysis finds no evidence of significant interactions between model and optimizer parameters, but the performance of each has a significant interaction with the objective function. Based on our results, we make recommendations on how best to configure EGO.
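    To make the Kriging/EGO component concrete, here is a compact sketch of the EGO loop: fit a Gaussian process to the evaluated points and choose the next point by Expected Improvement. The kernel setting (Matern with nu = 2.5) stands in for the kind of model parameter whose effect the chapter studies; the objective and candidate grid are illustrative assumptions, not the chapter's experimental setup.

```python
# One-dimensional EGO sketch: GP (Kriging) surrogate + Expected
# Improvement acquisition, for minimization.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def f(x):
    # Stand-in for an expensive black-box objective.
    return np.sin(3 * x) + 0.5 * x ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(5, 1))   # initial design
y = f(X).ravel()

for _ in range(10):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    cand = np.linspace(-2, 2, 400).reshape(-1, 1)
    mu, sigma = gp.predict(cand, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sigma, 1e-12)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)  # Expected Improvement
    x_next = cand[np.argmax(ei)].reshape(1, 1)            # next evaluation
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next).ravel())

print("best found:", X[np.argmin(y)].ravel(), y.min())
```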

    From evolutionary computation to the evolution of things

    Get PDF
    Evolution has provided a source of inspiration for algorithm designers since the birth of computers. The resulting field, evolutionary computation, has been successful in solving engineering tasks ranging in outlook from the molecular to the astronomical. Today, the field is entering a new phase as evolutionary algorithms that take place in hardware are developed, opening up new avenues towards autonomous machines that can adapt to their environment. We discuss how evolutionary computation compares with natural evolution and what its benefits are relative to other computing approaches, and we introduce the emerging area of artificial evolution in physical systems.
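    As a purely generic illustration of the kind of algorithm the field studies (not taken from the paper), a toy (mu + lambda) evolution strategy: a population of candidate solutions is varied by mutation and filtered by selection.

```python
# Toy (mu + lambda) evolution strategy minimizing the sphere function;
# a generic sketch of evolutionary computation, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    return np.sum(x ** 2)   # lower is better

mu, lam, dim, sigma = 5, 20, 10, 0.3
pop = rng.normal(size=(mu, dim))    # initial parents

for generation in range(100):
    # Variation: each offspring is a mutated copy of a random parent.
    parents = pop[rng.integers(mu, size=lam)]
    offspring = parents + sigma * rng.normal(size=(lam, dim))
    # Selection: keep the best mu of parents + offspring.
    combined = np.vstack([pop, offspring])
    pop = combined[np.argsort([fitness(x) for x in combined])[:mu]]

print("best fitness:", fitness(pop[0]))
```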

    Internal stresses in the cranial bone tissue

    No full text